05 - Camera optics, calibration, and stereovision

Robotics I

Poznan University of Technology, Institute of Robotics and Machine Intelligence


Goals

The objectives of this laboratory are to:

Resources

Part I: Camera optics


Source: Face distortion is not due to lens distortion

Each camera has a lens that focuses light onto the image sensor. The lens is characterized by its focal length: the distance between the lens's optical center and the image sensor when focused at infinity. The focal length determines the camera's field of view, the angular extent of the scene visible to the camera; a shorter focal length yields a wider field of view.
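The relationship between focal length and field of view follows from the pinhole model: FOV = 2·atan(d / 2f), where d is the sensor dimension along the chosen axis. A minimal sketch (the 50 mm / 24 mm lenses and 36 mm sensor width below are illustrative values, not part of the lab setup):

```python
import math

def field_of_view_deg(focal_length_mm: float, sensor_size_mm: float) -> float:
    """Angular field of view of a pinhole camera along one sensor dimension."""
    return math.degrees(2.0 * math.atan(sensor_size_mm / (2.0 * focal_length_mm)))

# Illustrative values: 36 mm sensor width (full-frame).
print(round(field_of_view_deg(50.0, 36.0), 1))  # 39.6 -> horizontal FOV of a 50 mm lens
print(round(field_of_view_deg(24.0, 36.0), 1))  # 73.7 -> a shorter lens sees a wider angle
```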

Why do we need camera calibration?

Every camera, regardless of cost, exhibits some distortion due to the lens and image sensor. Distortion is classified into two types:

* Radial distortion: straight lines appear curved, with the effect growing toward the image edges (barrel or pincushion distortion),
* Tangential distortion: caused by the lens not being perfectly parallel to the image sensor, so some image areas appear closer than they are.

Camera calibration can correct distortion. It also determines the camera’s intrinsic and extrinsic parameters, enabling the projection of 3D points onto the image plane and supporting computer vision applications.
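The projection mentioned above can be sketched with the pinhole model: a 3D point (X, Y, Z) in the camera frame maps to pixel (u, v) = (fx·X/Z + cx, fy·Y/Z + cy), where fx, fy, cx, cy are intrinsic parameters. The intrinsic values below are illustrative, not from any particular camera:

```python
def project_point(point_3d, fx, fy, cx, cy):
    """Project a 3D point (camera frame, metres) to pixel coordinates
    using the pinhole model: u = fx*X/Z + cx, v = fy*Y/Z + cy."""
    x, y, z = point_3d
    if z <= 0:
        raise ValueError("point must lie in front of the camera (Z > 0)")
    return (fx * x / z + cx, fy * y / z + cy)

# Illustrative intrinsics: focal lengths and principal point in pixels.
u, v = project_point((0.2, -0.1, 2.0), fx=800.0, fy=800.0, cx=320.0, cy=240.0)
print(u, v)  # 400.0 200.0
```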

Camera calibration

The camera calibration process involves capturing images of a calibration pattern and determining the relationship between its 3D points and their corresponding 2D projections in the image. The calibration pattern can be a chessboard, a circle grid, or any other pattern with known dimensions:


Examples of patterns. Source: Own materials

Note: Before starting the calibration process, determine the calibration pattern’s size and measure the relevant dimensions (in meters). The required measurements depend on the pattern type:

* Chessboard: count the number of inner corners (where black squares meet), e.g., 8×5, and measure the square size, e.g., 0.1 m,
* Circle grid: count the number of circles in the first two rows and the last column, e.g., 9×3, and measure the distance between two neighbouring circles, e.g., 0.21 m.

As a result of the calibration process, we can obtain:

* the camera matrix (intrinsic parameters: focal lengths fx, fy and principal point cx, cy),
* the distortion coefficients,
* the extrinsic parameters (rotation and translation of the camera relative to the calibration pattern).
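Calibration quality is commonly summarised by the RMS reprojection error: the root-mean-square pixel distance between detected pattern corners and the same corners reprojected through the estimated camera model. A minimal sketch with illustrative (made-up) corner positions:

```python
import math

def rms_reprojection_error(detected, reprojected):
    """Root-mean-square pixel distance between detected corner positions
    and their reprojections through the calibrated camera model."""
    assert len(detected) == len(reprojected)
    squared = [(u1 - u2) ** 2 + (v1 - v2) ** 2
               for (u1, v1), (u2, v2) in zip(detected, reprojected)]
    return math.sqrt(sum(squared) / len(squared))

# Illustrative corner positions (pixels); a good calibration is well below 1 px.
det = [(100.0, 50.0), (200.0, 50.0)]
rep = [(100.3, 50.4), (199.6, 49.7)]
print(round(rms_reprojection_error(det, rep), 3))  # 0.5
```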

Part II: Camera calibration in ROS 2

For calibration, we will use the camera_calibration package in ROS 2. This package provides a graphical user interface to determine the camera’s intrinsic and extrinsic parameters. While calibration typically involves capturing images in live mode, we will use a bag file with pre-recorded images to simplify the process.

In ROS 2, topic names are standardized to facilitate communication between nodes. The following table lists the standard topic names for camera data:

Topic name         Comment
image_raw          raw data from the camera driver, possibly Bayer-encoded
image              monochrome (grayscale) source image (distorted)
image_color        color (RGB or BGR) source image (distorted)
image_rect         monochrome (grayscale) rectified image (distortion removed)
image_rect_color   color (RGB or BGR) rectified image (distortion removed)

Note: Everything we do today should be done inside the container!

💥 💥 💥 Task 💥 💥 💥


Chessboard used for calibration. Source: Own materials

Note: The container should set up the ROS 2 environment automatically. If not, run the following commands: source /opt/ros/jazzy/setup.bash to source the ROS 2 environment, and export ROS_AUTOMATIC_DISCOVERY_RANGE=LOCALHOST to limit the topic discovery to the local machine.

colcon build --packages-select robotics_camera_calibration
source install/setup.bash
ros2 bag info data/calibration_bag/
ros2 run camera_calibration cameracalibrator \
    -p chessboard \
    --size <DEFINE_PATTERN_SIZE> \
    --square 0.065 \
    --no-service-check \
    --ros-args \
    -r image:=<DEFINE_IMAGE_TOPIC_LEFT_CAMERA>
docker exec -it ros2_calibration bash
ros2 bag play data/calibration_bag/


When calibration is finished, the calibrate button is green. Source: Own materials

Note: If bag playback finishes before the calibration is complete, play the bag again to continue. You can also replay the bag in slow motion using the --rate option (e.g., 0.8 or 0.6) to improve the calibration results.

Part III: Stereovision, double-cameras calibration, and disparity map in ROS 2


An example stereovision camera. Source: Own materials

Stereovision uses two cameras to extract 3D information from a scene. Calibration determines the cameras’ relative position and orientation, allowing for disparity map computation.

A disparity map is a 2D representation of a scene that encodes, for each pixel, the difference between its horizontal coordinates in the left and right images. It is used to compute the depth map, which represents the distance between the camera and objects in the scene.
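Finding the disparity of a pixel can be sketched as a 1D search along a rectified scanline: shift a small window over the right image row and keep the shift with the lowest sum of absolute differences (SAD). This is a deliberately simplified, single-row sketch; real stereo nodes such as stereo_image_proc use 2D block matching over whole images.

```python
def disparity_1d(left_row, right_row, x, window=1, max_disp=8):
    """Disparity of pixel x on a rectified scanline via SAD search.
    A scene point at column x in the left image appears at x - d in the right."""
    def sad(d):
        return sum(abs(left_row[x + k] - right_row[x + k - d])
                   for k in range(-window, window + 1))
    # The shifted window must not run past the left edge of the right row.
    candidates = range(0, min(max_disp, x - window) + 1)
    return min(candidates, key=sad)

# Synthetic scanlines: the same feature appears 3 px further left in the right image.
left_row  = [0, 0, 0, 9, 5, 7, 0, 0, 0, 0, 0, 0]
right_row = [9, 5, 7, 0, 0, 0, 0, 0, 0, 0, 0, 0]
print(disparity_1d(left_row, right_row, x=4))  # 3
```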

Epipolar geometry is a fundamental concept in stereovision that describes the geometric relationship between the two cameras. The two camera centers and a 3D point define an epipolar plane; its intersections with the image planes are the epipolar lines. A point in one image must have its correspondence on the associated epipolar line in the other image, which reduces stereo matching to a one-dimensional search.
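For an ideally rectified pair (no relative rotation, translation purely along the x axis), the fundamental matrix reduces, up to scale, to F = [[0,0,0],[0,0,-1],[0,1,0]], and the epipolar constraint x′ᵀ·F·x = 0 forces corresponding points onto the same image row. A sketch with made-up pixel coordinates:

```python
def matvec(M, v):
    """Multiply a 3x3 matrix (nested lists) by a 3-vector."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Fundamental matrix of an ideal rectified pair (translation along x, up to scale).
F = [[0.0, 0.0,  0.0],
     [0.0, 0.0, -1.0],
     [0.0, 1.0,  0.0]]

x_left = [150.0, 80.0, 1.0]       # homogeneous pixel in the left image
line_right = matvec(F, x_left)    # epipolar line a*u + b*v + c = 0
print(line_right)                 # [0.0, -1.0, 80.0] -> the row v = 80

x_right = [120.0, 80.0, 1.0]      # a correct match lies on that row
print(dot(x_right, line_right))   # 0.0 -> epipolar constraint satisfied
```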

A depth map is a 2D representation of the scene that encodes the distance between the camera and the objects in the scene. It is calculated from the disparity map using the camera calibration parameters.

The baseline is the distance between two cameras. It influences the disparity map resolution and depth map accuracy. A larger baseline enables measuring greater distances but increases the minimum measurable distance.
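The relation between the quantities above is Z = f·B / d, where Z is depth in metres, f the focal length in pixels, B the baseline in metres, and d the disparity in pixels. A minimal sketch with illustrative values (f = 700 px, B = 0.12 m; not the parameters of the lab camera):

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth (metres) from disparity (pixels): Z = f * B / d."""
    if disparity_px <= 0:
        return float("inf")  # zero disparity -> point at infinity
    return focal_px * baseline_m / disparity_px

# Illustrative parameters: f = 700 px, baseline = 0.12 m.
print(round(depth_from_disparity(20.0, 700.0, 0.12), 2))  # 4.2 m
print(round(depth_from_disparity(40.0, 700.0, 0.12), 2))  # 2.1 m: larger disparity = closer
```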

💥 💥 💥 Task 💥 💥 💥

docker exec -it ros2_calibration bash
colcon build --packages-select robotics_camera_calibration
source install/setup.bash
ros2 run camera_calibration cameracalibrator \
    -p chessboard \
    --size <DEFINE_PATTERN_SIZE> \
    --square 0.065 \
    --no-service-check \
    --ros-args \
    -r left:=<DEFINE_IMAGE_TOPIC_LEFT_CAMERA> \
    -r right:=<DEFINE_IMAGE_TOPIC_RIGHT_CAMERA>
docker exec -it ros2_calibration bash
ros2 bag play data/calibration_bag/
mkdir calibrationdata && tar xvf /tmp/calibrationdata.tar.gz -C calibrationdata
cat calibrationdata/left.yaml
cat calibrationdata/right.yaml

Note the differences in the projection matrices: for the right camera, the translation term Tx (first row, last column) should be negative, because it stores Tx = -fx · B. Dividing its absolute value by the focal length in the X direction (fx, the first entry of the first row) gives the stereo camera baseline B.
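The baseline computation described above can be sketched as follows; the projection matrix below is illustrative of what right.yaml contains (fx = 700 px), not the actual calibration result:

```python
def stereo_baseline_m(p_right):
    """Baseline from the right camera's 3x4 projection matrix.
    P[0][3] stores Tx = -fx * B, hence B = -Tx / fx."""
    fx = p_right[0][0]
    tx = p_right[0][3]
    return -tx / fx

# Illustrative right-camera projection matrix (fx = 700 px, Tx = -84 px*m... in fx units).
P_right = [[700.0,   0.0, 320.0, -84.0],
           [  0.0, 700.0, 240.0,   0.0],
           [  0.0,   0.0,   1.0,   0.0]]
print(stereo_baseline_m(P_right))  # 0.12 (metres)
```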

colcon build --packages-select robotics_camera_calibration
docker exec -it ros2_calibration bash
source install/setup.bash
ros2 launch robotics_camera_calibration stereo_image_proc.launch.py
ros2 run robotics_camera_calibration stereo_info_publisher
ros2 run image_view stereo_view \
    --ros-args \
    -p approximate_sync:=True \
    -r /stereo/left/image:=<DEFINE_IMAGE_TOPIC_RECT_LEFT_CAMERA> \
    -r /stereo/right/image:=<DEFINE_IMAGE_TOPIC_RECT_RIGHT_CAMERA> \
    -r /stereo/disparity:=/disparity
bash src/robotics_camera_calibration/scripts/download_corridor_bag.bash
ros2 bag play data/corridor_bag/ --loop

You should see the disparity map, which represents the pixel differences between the left and right images.


An example disparity image. Source: Own materials

ros2 run rqt_reconfigure rqt_reconfigure --force-discover

The stereovision node provides a set of parameters that can be adjusted in real time. Tune these parameters to optimize disparity map quality. Tips for setting them can be found here.


The tool enables real-time parameter changes. Source: Own materials

💥 💥 💥 Assignment 💥 💥 💥

To pass the course, a screenshot confirming the work done must be uploaded to the eCourses platform. It must include: